Computationally efficient banding of large covariance matrices for ordered data and connections to banding the inverse Cholesky factor

Authors

  • Y. Wang
  • M. J. Daniels
Abstract

In this article, we propose a computationally efficient approach to estimate (large) p-dimensional covariance matrices of ordered (or longitudinal) data based on an independent sample of size n. To do this, we construct the estimator based on a k-band partial autocorrelation matrix with the number of bands chosen using an exact multiple hypothesis testing procedure. This approach is considerably faster than many existing methods and only requires inversion of (k + 1)-dimensional covariance matrices. The resulting estimator is positive definite as long as k < n (where p can be larger than n). We make connections between this approach and banding the Cholesky factor of the modified Cholesky decomposition of the inverse covariance matrix (Wu and Pourahmadi, 2003) and show that the maximum likelihood estimator of the k-band partial autocorrelation matrix is the same as the k-band inverse Cholesky factor. We evaluate our estimator via extensive simulations and illustrate the approach using high-dimensional sonar data.
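To make the connection to banding the inverse Cholesky factor concrete, the following NumPy sketch fits, for a given band k, the regressions of each variable on its k immediate predecessors; these define a k-banded unit lower-triangular factor T and innovation variances D with Σ⁻¹ = TᵀD⁻¹T. It is only an illustration of the banded-regression structure, not the authors' implementation: the multiple-testing choice of k is not included, the data are assumed centered, and the function name band_inverse_cholesky is a placeholder.

    import numpy as np

    def band_inverse_cholesky(X, k):
        """Sketch: k-band the Cholesky factor of the modified Cholesky
        decomposition of the inverse covariance (in the style of Wu and
        Pourahmadi, 2003).

        X : (n, p) array of centered observations with a natural ordering.
        k : band width; each variable is regressed on at most k predecessors,
            so k < n is needed for the regressions (and positive definiteness).
        """
        n, p = X.shape
        T = np.eye(p)                    # unit lower-triangular factor
        d = np.empty(p)                  # innovation (prediction) variances
        d[0] = X[:, 0].var()
        for j in range(1, p):
            lo = max(0, j - k)
            Z = X[:, lo:j]               # at most k immediate predecessors
            y = X[:, j]
            # least squares on a (<= k)-dimensional design: only small systems
            # ever need to be solved, no matter how large p is
            phi, *_ = np.linalg.lstsq(Z, y, rcond=None)
            T[j, lo:j] = -phi
            d[j] = (y - Z @ phi).var()
        # Sigma^{-1} = T' D^{-1} T, so Sigma = T^{-1} D (T^{-1})'
        Tinv = np.linalg.inv(T)          # explicit inverse only for illustration
        return Tinv @ np.diag(d) @ Tinv.T

For instance, band_inverse_cholesky(X, k=2) returns a p-by-p estimate that is positive definite whenever all residual variances are positive, even when p exceeds n.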


Related papers

Sparse Estimation of Large Covariance Matrices via a Nested Lasso Penalty, by Elizaveta Levina, Adam Rothman

The paper proposes a new covariance estimator for large covariance matrices when the variables have a natural ordering. Using the Cholesky decomposition of the inverse, we impose a banded structure on the Cholesky factor, and select the bandwidth adaptively for each row of the Cholesky factor, using a novel penalty we call nested Lasso. This structure has more flexibility than regular banding, ...



A new approach to Cholesky-based covariance regularization in high dimensions

In this paper we propose a new regression interpretation of the Cholesky factor of the covariance matrix, as opposed to the well-known regression interpretation of the Cholesky factor of the inverse covariance, which leads to a new class of regularized covariance estimators suitable for high-dimensional problems. Regularizing the Cholesky factor of the covariance via this regression interpretat...

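As a rough illustration of the regression interpretation described in the snippet above, the sketch below regresses each (centered) variable on the innovations of its predecessors and keeps only the k most recent ones, which yields a k-banded Cholesky factor of the covariance itself; this reading of the snippet, the fixed common band k, and the name band_covariance_cholesky are assumptions made here for illustration rather than details taken from the paper.

    import numpy as np

    def band_covariance_cholesky(X, k):
        """Sketch: banded Cholesky factor of the covariance itself, built by
        regressing each centered variable on the innovations of its k most
        recent predecessors."""
        n, p = X.shape
        L = np.eye(p)                    # unit lower-triangular factor of Sigma
        E = np.empty_like(X, dtype=float)  # successive innovations (residuals)
        E[:, 0] = X[:, 0]
        for j in range(1, p):
            lo = max(0, j - k)
            Z = E[:, lo:j]               # only the k most recent innovations
            coef, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
            L[j, lo:j] = coef
            E[:, j] = X[:, j] - Z @ coef
        D = np.diag(E.var(axis=0))
        return L @ D @ L.T               # Sigma_hat = L D L'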

Estimation of Large Precision Matrices Through Block Penalization (May 2008)

This paper focuses on exploring the sparsity of the inverse covariance matrix Σ⁻¹, or the precision matrix. We form blocks of parameters based on each off-diagonal band of the Cholesky factor from its modified Cholesky decomposition, and penalize each block of parameters using the L2-norm instead of individual elements. We develop a one-step estimator, and prove an oracle property which cons...

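Using the modified Cholesky decomposition referred to in the snippet above, one way to write the band-wise blocks and their L2 penalty is the display below; the symbols T, D, t_b and the tuning parameter λ are notation chosen here for illustration and need not match the paper's.

    % Modified Cholesky decomposition of the precision matrix:
    % T is unit lower triangular, D = diag(d_1, ..., d_p).
    \Sigma^{-1} = T^{\top} D^{-1} T
    % The b-th subdiagonal of T is collected into a block t_b, and blocks,
    % rather than individual entries, are penalized with the L2-norm:
    \operatorname{pen}(T) = \lambda \sum_{b=1}^{p-1} \lVert t_b \rVert_2 ,
    \qquad t_b = ( t_{b+1,1},\, t_{b+2,2},\, \dots,\, t_{p,\,p-b} )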


Journal:
  • Journal of Multivariate Analysis

Volume 130, Issue -

Pages -

Publication date 2014